
    Partisipasi Perempuan Dalam Pengelolaan Lingkungan Hidup (Women's Participation in Environmental Management)

    Women have strong links with the environment. In their role as managers of the household, they interact closely with the environment and natural resources. The era of globalization has indirectly led to growing environmental pollution and environmental damage. Women and children are among the citizens directly affected by pollution. When women's health is impaired by an unhealthy living environment, the health of children, as the future generation, is indirectly affected as well. Law No. 32 of 2009 on Environmental Protection and Management (UUPPLH) has introduced democratic changes, granting broad authority to local governments and prioritizing community involvement in controlling environmental damage. The role of women is implicitly covered by Article 70 on the role of the community. Under the environmental protection and management act, women can act as a form of social control and can take part in environmental policy-making through the legislature.

    The Stellar Dynamics of Omega Centauri

    The stellar dynamics of Omega Centauri are inferred from the radial velocities of 469 stars measured with CORAVEL (Mayor et al. 1997). Rather than fit the data to a family of models, we generate estimates of all dynamical functions nonparametrically, by direct operation on the data. The cluster is assumed to be oblate and edge-on but mass is not assumed to follow light. The mean motions are consistent with axisymmetry but the rotation is not cylindrical. The peak rotational velocity is 7.9 km/s at 11 pc from the center. The apparent rotation of Omega Centauri is attributable in part to its proper motion. We reconstruct the stellar velocity ellipsoid as a function of position, assuming isotropy in the meridional plane. We find no significant evidence for a difference between the velocity dispersions parallel and perpendicular to the meridional plane. The mass distribution inferred from the kinematics is slightly more extended than, though not strongly inconsistent with, the luminosity distribution. We also derive the two-integral distribution function f(E,Lz) implied by the velocity data.
    Comment: 25 Latex pages, 12 Postscript figures, uses aastex, epsf.sty. Submitted to The Astronomical Journal, December 199
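
    The nonparametric approach described above can be illustrated with a simple kernel-smoothed estimate of a rotation profile from discrete stellar radial velocities. The sketch below is a generic example with assumed inputs (arrays of projected major-axis positions x and line-of-sight velocities v, plus a toy synthetic data set), not the estimators actually used in the paper.

```python
import numpy as np

def rotation_profile(x, v, grid, bandwidth=2.0):
    """Kernel-weighted mean line-of-sight velocity as a function of
    projected major-axis position (a generic nonparametric estimate).

    x, v      : arrays of star positions (pc) and radial velocities (km/s)
    grid      : positions at which to evaluate the smoothed profile
    bandwidth : Gaussian kernel width in the same units as x
    """
    x, v = np.asarray(x, float), np.asarray(v, float)
    profile = np.empty_like(grid, dtype=float)
    for i, x0 in enumerate(grid):
        w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)  # Gaussian weights
        profile[i] = np.sum(w * v) / np.sum(w)          # weighted mean velocity
    return profile

# Toy usage with purely synthetic data (469 stars, rotation rising toward ~8 km/s)
rng = np.random.default_rng(0)
x = rng.uniform(-30, 30, 469)                            # projected positions in pc
v = 7.9 * np.tanh(x / 11.0) + rng.normal(0, 6, x.size)   # km/s, with scatter
grid = np.linspace(-25, 25, 51)
vrot = rotation_profile(x, v, grid)
```

    In practice one would also subtract the systemic velocity and correct for the perspective (apparent) rotation induced by the cluster's proper motion before interpreting such a profile.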

    Interpolating point spread function anisotropy

    Planned wide-field weak lensing surveys are expected to reduce the statistical errors on the shear field to unprecedented levels. In contrast, systematic errors like those induced by the convolution with the point spread function (PSF) will not benefit from that scaling effect and will require very accurate modeling and correction. While numerous methods have been devised to carry out the PSF correction itself, modeling of the PSF shape and its spatial variations across the instrument field of view has, so far, attracted much less attention. This step is nevertheless crucial because the PSF is only known at star positions while the correction has to be performed at any position on the sky. A reliable interpolation scheme is therefore mandatory and a popular approach has been to use low-order bivariate polynomials. In the present paper, we evaluate four other classical spatial interpolation methods based on splines (B-splines), inverse distance weighting (IDW), radial basis functions (RBF) and ordinary Kriging (OK). These methods are tested on the Star-challenge part of the GRavitational lEnsing Accuracy Testing 2010 (GREAT10) simulated data and are compared with the classical polynomial fitting (Polyfit). We also test all our interpolation methods independently of the way the PSF is modeled, by interpolating the GREAT10 star fields themselves (i.e., the PSF parameters are known exactly at star positions). We find in that case RBF to be the clear winner, closely followed by the other local methods, IDW and OK. The global methods, Polyfit and B-splines, are largely behind, especially in fields with (ground-based) turbulent PSFs. In fields with non-turbulent PSFs, all interpolators reach a variance on PSF systematics σ_sys^2 better than the 1×10^-7 upper bound expected by future space-based surveys, with the local interpolators performing better than the global ones.
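
    As a minimal sketch of spatial PSF interpolation with radial basis functions (one of the local methods evaluated above), assuming the PSF ellipticity components (e1, e2) have been measured at star positions, one could use scipy's RBF interpolator as below. This is a generic illustration, not the GREAT10 evaluation code; the array names and the thin-plate-spline kernel choice are assumptions.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def interpolate_psf_ellipticity(star_xy, star_e, target_xy, smoothing=1e-3):
    """Interpolate PSF ellipticity components measured at star positions to
    arbitrary field positions with a thin-plate-spline RBF.

    star_xy   : (n_stars, 2) star positions
    star_e    : (n_stars, 2) measured (e1, e2) at those positions
    target_xy : (n_targets, 2) positions where the PSF model is needed
    """
    rbf = RBFInterpolator(star_xy, star_e,
                          kernel='thin_plate_spline', smoothing=smoothing)
    return rbf(target_xy)

# Usage (hypothetical arrays):
# e_model = interpolate_psf_ellipticity(star_xy, star_e, galaxy_xy)
```

    IDW and Kriging follow the same pattern of a local, distance-weighted prediction; the global polynomial and B-spline fits instead estimate one surface per ellipticity component over the whole field of view.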

    Perilaku Metakognisi Siswa Dalam Menyelesaikan Masalah Kimia (Students' Metacognitive Behavior in Solving Chemistry Problems)

    This study describes the metacognitive behavior of students during problem solving in degree-of-acidity (pH) calculations. The research was conducted at SMAN 1 Pamona Utara and at SMA GKST 2 Pamona Puselemba. The subjects were 6 students of class XII IPA, assigned to high, middle and low categories on the basis of a respondent-selection test. This is a qualitative study. The data consisted of test results, thinking-aloud transcripts, and interviews, as well as the results of a metacognition-activity inventory questionnaire. The data were described and validated using the triangulation method. Analysis of the thinking-aloud data showed that the high and middle categories were able to assess their mistakes. The high category tended to monitor mistakes while making calculations. The middle category emphasized parts of the calculations to avoid mistakes and to justify the calculations and solutions, and also tended to check the problem-solving steps. The low category only checked the answers. The questionnaire results showed that the average metacognition activity of the middle category was the highest and that of the low category the lowest. Students' lack of knowledge of the concepts and principles underlying the problem prevented them from solving it. The highest level of students' metacognition observed in this study was strategic use.
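
    For context, a degree-of-acidity problem of the kind referred to above typically asks for the pH of a weak acid solution; the sketch below is an illustrative calculation with assumed values, not an item taken from the study's instrument.

```python
import math

def weak_acid_pH(Ka, C):
    """pH of a weak monoprotic acid HA of analytical concentration C (mol/L),
    from the exact equilibrium quadratic [H+]^2 + Ka*[H+] - Ka*C = 0."""
    h = (-Ka + math.sqrt(Ka**2 + 4 * Ka * C)) / 2  # positive root = [H+]
    return -math.log10(h)

# Example: 0.10 M acetic acid, Ka = 1.8e-5  ->  pH ~ 2.88
print(round(weak_acid_pH(1.8e-5, 0.10), 2))
```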

    Firedec: a two-channel finite-resolution image deconvolution algorithm

    We present a two-channel deconvolution method that decomposes images into a parametric point-source channel and a pixelized extended-source channel. Based on the central idea of the deconvolution algorithm proposed by Magain, Courbin & Sohy (1998), the method aims at improving the resolution of the data rather than at completely removing the point spread function (PSF). Improvements over the original method include a better regularization of the pixel channel of the image, based on wavelet filtering and multiscale analysis, and a better controlled separation of the point source vs. the extended source. In addition, the method is able to simultaneously deconvolve many individual frames of the same object taken with different instruments under different PSF conditions. For this purpose, we introduce a general geometric transformation between individual images. This transformation allows the combination of the images without having to interpolate them. We illustrate the capability of our algorithm using real and simulated images with complex diffraction-limited PSF.
    Comment: Accepted in A&A. An application of the technique to real data is available in Cantale et al. http://arxiv.org/abs/1601.05192v
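
    Schematically, and in notation introduced here only for illustration (it does not follow the paper's symbols exactly), the central idea of the Magain, Courbin & Sohy (1998) approach on which this method builds is: split the observed PSF t as t = r * s, keep the narrower PSF r in the deconvolved image, and minimize a regularized least-squares criterion.

```latex
% F: deconvolved model = analytic point-source channel + pixelized channel h(x)
% t = r * s : observed PSF split into target PSF r and reconvolution kernel s
\begin{align}
  F(\mathbf{x}) &= \sum_{k} a_k \, r(\mathbf{x}-\mathbf{c}_k) + h(\mathbf{x}),
  \qquad t = r \ast s, \\
  S &= \sum_i \frac{\bigl[(s \ast F)(\mathbf{x}_i) - D(\mathbf{x}_i)\bigr]^2}{\sigma_i^2}
      + \lambda\, H(h).
\end{align}
```

    Here D is the data, σ_i the noise in pixel i, a_k and c_k the point-source intensities and positions, and H a smoothing term on the extended channel h (in this method, based on wavelet filtering and multiscale analysis) with weight λ.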

    Fornax compact object survey FCOS: On the nature of Ultra Compact Dwarf galaxies

    The results of the Fornax Compact Object Survey (FCOS) are presented. The FCOS aims at investigating the nature of the Ultra Compact Dwarf galaxies (UCDs) recently discovered in the center of the Fornax cluster (Drinkwater et al. 2000). 280 unresolved objects in the magnitude range covering UCDs and bright globular clusters (18<V<21 mag) were observed spectroscopically. 54 new Fornax members were discovered, along with five of the seven already known UCDs. Their distribution in radial velocity, colour, magnitude and space was investigated. It is found that bright compact objects (V<20 or M_V<-11.4 mag), including the UCDs, have a higher mean velocity than faint compact objects (V>20 mag) at 96% confidence. The mean velocity of the bright compact objects is consistent with that of the dwarf galaxy population in Fornax, but inconsistent with that of NGC 1399's globular cluster system at 93.5% confidence. The compact objects follow a colour-magnitude relation with a slope very similar to that of normal dEs, but shifted about 0.2 mag redwards. The magnitude distribution of compact objects shows a smooth transition between UCDs and GCs, with an overpopulation of 8 +/- 4 objects for V<20 mag with respect to the extrapolation of NGC 1399's GC luminosity function. The spatial distribution of bright compact objects is more extended than that of the faint ones at 88% confidence. All our findings are consistent with the threshing scenario (Bekki et al. 2003), suggesting that a substantial fraction of compact Fornax members brighter than V~20 mag could have been created by threshing dE,Ns. Fainter than V~20 mag, the majority of the objects seem to be genuine GCs. Our results are also consistent with merged stellar super-clusters (Fellhauer & Kroupa 2002) as an alternative explanation for the bright compact objects.
    Comment: 15 pages, 11 figures, accepted for publication in A&A
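
    A difference in mean velocity between two samples, such as the bright and faint compact objects above, can be assigned a confidence level with a simple permutation test. The sketch below is a generic illustration with hypothetical input arrays, not necessarily the statistical test used in the FCOS analysis.

```python
import numpy as np

def mean_diff_confidence(v_bright, v_faint, n_perm=10000, seed=0):
    """One-sided permutation test: confidence that the bright sample has a
    higher mean velocity than the faint sample (labels shuffled n_perm times)."""
    rng = np.random.default_rng(seed)
    v_bright, v_faint = np.asarray(v_bright, float), np.asarray(v_faint, float)
    observed = v_bright.mean() - v_faint.mean()
    pooled = np.concatenate([v_bright, v_faint])
    n_b = v_bright.size
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if pooled[:n_b].mean() - pooled[n_b:].mean() >= observed:
            count += 1
    return 1.0 - count / n_perm  # e.g. 0.96 corresponds to 96% confidence

# Usage (hypothetical radial velocities in km/s):
# conf = mean_diff_confidence(v_bright, v_faint)
```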

    COSMOGRAIL: the COSmological MOnitoring of GRAvItational Lenses XV. Assessing the achievability and precision of time-delay measurements

    COSMOGRAIL is a long-term photometric monitoring of gravitationally lensed QSOs aimed at implementing Refsdal's time-delay method to measure cosmological parameters, in particular H0. Given long and well-sampled light curves of strongly lensed QSOs, time-delay measurements require numerical techniques whose quality must be assessed. To this end, and also in view of future monitoring programs or surveys such as the LSST, a blind signal-processing competition named Time Delay Challenge 1 (TDC1) was held in 2014. The aim of the present paper, which is based on the simulated light curves from the TDC1, is twofold. First, we test the performance of the time-delay measurement techniques currently used in COSMOGRAIL. Second, we analyse the quantity and quality of the harvest of time delays obtained from the TDC1 simulations. To achieve these goals, we first discover time delays through a careful inspection of the light curves via a dedicated visual interface. Our measurement algorithms can then be applied to the data in an automated way. We show that our techniques have no significant biases and yield adequate uncertainty estimates, resulting in reduced chi2 values between 0.5 and 1.0. We provide estimates for the number and precision of time-delay measurements that can be expected from future time-delay monitoring campaigns as a function of the photometric signal-to-noise ratio and of the true time delay. We make our blind measurements on the TDC1 data publicly available.
    Comment: 11 pages, 8 figures, published in Astronomy & Astrophysics
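
    As an illustration of the kind of curve shifting involved (this is not one of the actual COSMOGRAIL measurement techniques), a minimal delay estimator scans trial delays, shifts one light curve, removes a free magnitude offset, and minimizes the error-weighted residuals. All array names below are assumptions.

```python
import numpy as np

def estimate_delay(t_a, m_a, t_b, m_b, err_b, delays):
    """Brute-force delay scan: shift curve B by each trial delay, compare it to
    curve A interpolated onto the shifted epochs, and return the delay that
    minimizes the weighted squared residuals after removing the mean magnitude
    offset. A minimal illustration only; real light curves also require
    handling of microlensing variability and seasonal gaps."""
    chi2 = []
    for tau in delays:
        m_a_interp = np.interp(t_b + tau, t_a, m_a)   # A evaluated at shifted B epochs
        offset = np.mean(m_b - m_a_interp)            # free magnitude offset
        resid = (m_b - offset - m_a_interp) / err_b
        chi2.append(np.sum(resid ** 2))
    chi2 = np.asarray(chi2)
    return delays[np.argmin(chi2)], chi2

# Usage (hypothetical arrays of epochs, magnitudes and errors):
# best_delay, chi2_curve = estimate_delay(t_a, m_a, t_b, m_b, err_b,
#                                         delays=np.linspace(-100, 100, 401))
```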

    Economic evaluation of the eradication program for bovine viral diarrhea in the Swiss dairy sector

    The aim of this study was to conduct an economic evaluation of the BVD eradication program in the Swiss dairy sector. The situation before the start of the program (herd-level prevalence: 20%) served as a baseline scenario. Production models for three dairy farm types were used to estimate gross margins as well as net production losses and expenditures caused by BVD. The total economic benefit was estimated as the difference in disease costs between the baseline scenario and the implemented eradication program and was compared to the total eradication costs in a benefit-cost analysis. Data on the impact of BVD virus (BVDV) infection on animal health, fertility and production parameters were obtained empirically in a retrospective epidemiological case-control study in Swiss dairy herds and complemented by literature. Economic and additional production parameters were based on benchmarking data and published agricultural statistics. The eradication costs comprised the cumulative expenses for sampling and diagnostics. The economic model consisted of a stochastic simulation in @Risk for Excel with 20,000 iterations and was conducted for a time period of 14 years (2008–2021).
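
    A stochastic benefit-cost calculation of this kind can be reproduced in outline with a simple Monte Carlo simulation. In the sketch below, the distributions and figures are placeholders rather than the study's empirical inputs, and numpy stands in for @Risk.

```python
import numpy as np

rng = np.random.default_rng(42)
n_iter = 20_000          # iterations, as in the study
years = 14               # evaluation horizon (2008-2021)

# Hypothetical annual disease costs per scenario (million CHF), drawn from
# triangular distributions as placeholders for the study's production models.
baseline_costs = rng.triangular(5, 9, 14, size=(n_iter, years)).sum(axis=1)
eradication_disease_costs = rng.triangular(0.5, 2, 5, size=(n_iter, years)).sum(axis=1)
eradication_program_costs = rng.triangular(40, 55, 70, size=n_iter)  # sampling + diagnostics

benefit = baseline_costs - eradication_disease_costs   # avoided disease costs
net_benefit = benefit - eradication_program_costs
bc_ratio = benefit / eradication_program_costs

print(f"median benefit-cost ratio: {np.median(bc_ratio):.2f}")
print(f"P(net benefit > 0): {np.mean(net_benefit > 0):.2%}")
```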